

Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

Neural Information Processing Systems

Given an n-by-d data matrix A, principal component projection (PCP) and principal component regression (PCR), i.e. projection and regression restricted to the top eigenspace of A, are fundamental problems in machine learning, optimization, and numerical analysis. In this paper we provide the first algorithms that solve these problems in nearly linear time for a fixed eigenvalue distribution and large n. This improves upon previous methods, which had superlinear running times when either the number of top eigenvalues or the gap between the eigenspaces was large. We achieve our results by applying rational polynomial approximations to reduce the problem to solving asymmetric linear systems, which we solve with a variant of SVRG. We corroborate these findings with preliminary empirical experiments.
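For concreteness, the sketch below is a dense-linear-algebra reference implementation of PCP and PCR, i.e. the O(d^3) eigendecomposition baseline whose cost the paper's nearly-linear-time algorithms avoid. The threshold `lam` separating the top eigenspace and the assumption that A has full column rank are choices made here for illustration; this is not the paper's algorithm:

```python
import numpy as np

def pcp_pcr_reference(A, b, lam):
    """Exact PCP/PCR via a full eigendecomposition of A^T A.

    O(d^3) baseline; the paper's contribution is avoiding this cost.
    Assumes A has full column rank so the least-squares solution is unique.
    """
    M = A.T @ A
    evals, V = np.linalg.eigh(M)
    Vtop = V[:, evals > lam]              # eigenvectors spanning the top eigenspace
    P = Vtop @ Vtop.T                     # PCP projector onto that eigenspace
    x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
    x_pcr = P @ x_ls                      # PCR = least-squares solution, projected
    return P, x_pcr
```

Projecting the ordinary least-squares solution onto the top eigenspace coincides with regressing within it: both equal V_top diag(1/lambda_top) V_top^T A^T b when A has full column rank.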

  linear time, principal component projection and regression
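The abstract's final reduction solves linear systems with an SVRG variant. That idea can be illustrated on the simpler symmetric case: the sketch below runs textbook SVRG on a least-squares system. It is a generic stand-in, not the paper's asymmetric variant, and the step size `eta` and loop lengths are ad-hoc choices for a small toy problem:

```python
import numpy as np

def svrg_least_squares(A, c, eta=0.005, epochs=50, m=2000, seed=0):
    """Textbook SVRG for min_x (1/2n) * ||Ax - c||^2.

    Each inner step touches one row of A plus a full-gradient snapshot,
    so per-step cost is O(d) -- the property that makes SVRG-style
    solvers attractive for the linear systems arising in the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()
        mu = A.T @ (A @ x_snap - c) / n       # full gradient at the snapshot
        for _ in range(m):
            i = rng.integers(n)
            a = A[i]
            # variance-reduced stochastic gradient
            g = a * (a @ x - c[i]) - a * (a @ x_snap - c[i]) + mu
            x = x - eta * g
    return x
```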

Reviews: Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

Neural Information Processing Systems

I will maintain my initial score. I did review the discussion of rLanczos and sLanczos in Appendices F.1 and F.2, and I agree that several additional sentences in the main paper should suffice to clarify the methods. The result is achieved by using Zolotarev rational functions to approximate the sign function, in place of the polynomial approximations used in prior work. Evaluating the Zolotarev rational functions requires solving squared ridge-regression problems. The authors reduce these squared ridge-regression problems to solving asymmetric linear systems and analyze the convergence of an SVRG solver for generic asymmetric linear systems.
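The sign-function view the review describes can be made concrete with dense linear algebra. The sketch below uses Newton's iteration for the matrix sign function as a simple stand-in for the Zolotarev rational functions (which achieve far lower degree, and which the paper evaluates via ridge-regression solves rather than the explicit inverses used here):

```python
import numpy as np

def pcp_via_sign(M, lam, iters=40):
    """PCP projector via the identity P = (I + sign(M - lam*I)) / 2.

    sign(M - lam*I) has eigenvalue +1 on the top eigenspace of M and -1
    elsewhere, so averaging with I yields the projector. Newton's
    iteration X <- (X + X^{-1}) / 2 drives every eigenvalue to +/-1; it
    converges whenever lam is not an eigenvalue of M. Explicit inverses
    are for illustration only and cost O(d^3) per step.
    """
    d = M.shape[0]
    X = M - lam * np.eye(d)
    for _ in range(iters):
        X = 0.5 * (X + np.linalg.inv(X))
    return 0.5 * (np.eye(d) + X)
```

The design choice motivating Zolotarev functions is exactly the one this sketch sidesteps: a rational approximation of low degree replaces many iterations, and each linear solve it requires can be done stochastically instead of by direct inversion.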


Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

Jin, Yujia, Sidford, Aaron

Neural Information Processing Systems